- Backpropagation Program
-
- 1. Purpose:
-      a. Initialize an MLP using random initial weights
-      b. Train an MLP network using backpropagation
-
- 2. Features:
-      a. Train with or without a momentum factor
-      b. Train with or without batching (momentum and batching cannot be used simultaneously)
- c. Adaptive learning factor
- d. Training MSE and error percentages are shown
- e. Feature importance can be measured
- f. Saves weights to disk
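- The momentum feature above can be illustrated with a minimal sketch. This
- assumes the common form of the momentum update (a decaying running sum of
- past gradient steps); the program's exact formula is not shown in this file.
- The lr and momentum values mirror the .01 and .98 entries in the example
- parameter file.

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, momentum=0.98):
    """One weight update with a momentum factor (assumed form).

    velocity accumulates a decaying sum of past gradient steps, so
    successive updates in the same direction build up speed.
    """
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# Toy usage: minimize f(w) = w**2 from w = 1.0.
w, v = 1.0, 0.0
for _ in range(3):
    grad = 2.0 * w              # df/dw for f(w) = w**2
    w, v = sgd_momentum_step(w, grad, v)
```

- With batching, the gradient would instead be averaged over a batch of
- patterns before a single update, which is why the two options conflict.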
-
- 3. Example Run of Backpropagation Program
- a. Go to the "Batch Processing" option and press <ret>
-      b. Observe the parameter file with commented keyboard responses:
-
-
- gls.top ! file storing network structure
- 2 ! 1 for existing weights, 2 for random initial weights
- gls ! filename for training data
- 0 ! Enter number of patterns to read (0 for all patterns)
- 0, 0 ! Enter numbers of 1st and last patterns to examine (0 0 for none)
- 2., 1.5 ! Enter desired standard deviation and mean of net functions
- 1 ! Enter batch size (1 for no batching, 0 for full batching)
- .01, .98 ! learning factor, momentum term
- 20, .001 ! Number of training iterations, threshold on MSE
- 4 ! 1 to continue, 2 to change weights, 3 for a new data file, 4 to stop
- 1 ! Enter 1 to perform feature selection, 0 else
- gls.wts ! filename for saving the weights
-
-
- The program will read all patterns from the file gls, and train an MLP
- using the network structure file gls.top, which is shown below.
-
- 4
- 4 5 2 1
- 1 1 1
-
- The network will have 4 layers: 4 inputs, 7 hidden units divided
- between 2 hidden layers (5 and 2 units), and 1 output. In addition,
- layers 2, 3, and 4 connect to all previous layers. Training will stop
- after 20 iterations, or when the training MSE falls below .001.
- The final network weights will be stored in the file gls.wts.
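- The topology described above (each layer fed by the outputs of all
- previous layers, including the inputs) can be sketched as a forward pass.
- Sigmoid hidden units and linear output are assumptions; the program's
- actual activation functions are not stated in this file.

```python
import numpy as np

def forward(x, weights):
    """Forward pass for an MLP whose layers connect to ALL previous
    layers, matching the '1 1 1' connection flags in gls.top."""
    acts = [x]                                # outputs of every layer so far
    for i, (w, b) in enumerate(weights):
        z = np.concatenate(acts) @ w + b      # net over all earlier outputs
        last = i == len(weights) - 1
        acts.append(z if last else 1.0 / (1.0 + np.exp(-z)))  # assumed sigmoid
    return acts[-1]

# Layer sizes from the example gls.top: 4 inputs, 5 and 2 hidden, 1 output.
sizes = [4, 5, 2, 1]
rng = np.random.default_rng(0)
weights, fan_in = [], 0
for i in range(1, len(sizes)):
    fan_in += sizes[i - 1]                    # inputs from every earlier layer
    weights.append((rng.standard_normal((fan_in, sizes[i])) * 0.1,
                    np.zeros(sizes[i])))
y = forward(rng.standard_normal(4), weights)
```

- Note the growing fan-in: layer 2 sees 4 inputs, layer 3 sees 4 + 5 = 9
- values, and the output layer sees 4 + 5 + 2 = 11.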
- c. Exit the DOS editor and observe the program running
- d. Go to the "Examine Program Output" option and press <ret>
-      e. You can run this program on your own data simply by editing the
-         parameter file in the "Batch Run" option.
-
-
-